Neuromorphic Sensors in Robotics
I study the characteristics of event cameras as perception sensors for field robotics, as an alternative to conventional sensors (LiDARs, frame-based cameras).
Catching Fast Moving Objects with Events #
Humans (and other animals) are very good at reacting quickly based on visual information. In this short video, a baseball player catches a ball flying at 90 m/s! We tried to replicate some of these results using event cameras.
Our system was capable of estimating the trajectory of the ball in mid-air and moving a robotic net to catch it. This happens quite fast: we only have 300 ms to capture data, perform calculations, and execute a one-shot motion. Our system was able to catch balls flying at 14 m/s!
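The trajectory estimation step can be sketched as a ballistic fit: given a handful of 3D ball positions (here assumed to be already triangulated from the event streams — the triangulation itself is not shown), gravity is the only unknown-free force, so subtracting the known gravity term makes every axis linear in time and a least-squares line fit recovers the initial position and velocity. This is a minimal illustration, not our actual pipeline:

```python
import numpy as np

G = 9.81  # gravitational acceleration, m/s^2

def fit_ballistic(t, pos):
    """Fit a ballistic model p(t) = p0 + v0*t - 0.5*g*t^2 (z only).

    t   : (N,) timestamps in seconds
    pos : (N, 3) ball positions in meters (assumed pre-triangulated)
    Returns the estimated initial position p0 and velocity v0.
    """
    t = np.asarray(t, dtype=float)
    pos = np.asarray(pos, dtype=float)
    # Add back the known gravity term on z, so every axis is linear in t.
    adjusted = pos.copy()
    adjusted[:, 2] += 0.5 * G * t**2
    # Least-squares fit of adjusted = p0 + v0 * t, all axes at once.
    A = np.stack([np.ones_like(t), t], axis=1)
    coeffs, *_ = np.linalg.lstsq(A, adjusted, rcond=None)
    return coeffs[0], coeffs[1]  # p0, v0

def predict(p0, v0, t):
    """Ball position at time t under the fitted ballistic model."""
    p = p0 + v0 * t
    p = p.copy()
    p[2] -= 0.5 * G * t**2
    return p
```

With the fit in hand, `predict` can be evaluated at the time the ball crosses the catching plane to command the one-shot net motion, all well inside the 300 ms budget.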
High-altitude Orthomapping with Event Cameras #
Orthomapping is the process of generating a map by stitching multiple aerial pictures. Traditionally performed with frame-based cameras, this process is vulnerable to problems such as overexposed bright areas and underexposed shadows. Our goal was to explore how event cameras would perform for this particular task, as their higher dynamic range makes them less vulnerable to these problems.
We found that fusing events with RGB images helped improve the number of reconstructed pixels in an orthomosaic, particularly in challenging areas of the images.
This project also helped us develop a platform for high-altitude event camera experiments, which we will be using soon!
M3ED: High-resolution, Multi-modal, Multi-environment Event Dataset #
High-resolution event cameras enable significant improvements in mapping. This short clip shows the difference between MVSEC and this work, M3ED.
We developed a standard sensor package featuring two high-resolution event cameras, two grayscale global-shutter cameras, one global-shutter RGB camera, a temperature-compensated IMU, and a 3D LiDAR. This package was mounted on three different platforms: a UAV, a car, and a Spot robot. We collected sequences in indoor settings, urban environments, and forests.